

Creators/Authors contains: "Sharp, Richard R"

Note: Clicking a Digital Object Identifier (DOI) link takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

Some links on this page may lead to non-federal websites, whose policies may differ from those of this site.

  1. Background: As artificial intelligence (AI) tools are integrated more widely into psychiatric medicine, it is important to consider the impact these tools will have on clinical practice. Objective: This study aimed to characterize physician perspectives on the potential impact of AI tools in psychiatric medicine. Methods: We interviewed 42 physicians (21 psychiatrists and 21 family medicine practitioners). The interviews used detailed clinical case scenarios involving the use of AI technologies in the evaluation, diagnosis, and treatment of psychiatric conditions. Interviews were transcribed and subsequently analyzed using qualitative analysis methods. Results: Physicians highlighted multiple potential benefits of AI tools, including support for optimizing pharmaceutical efficacy, reducing administrative burden, aiding shared decision-making, and increasing access to health services, and they were optimistic about the long-term impact of these technologies. This optimism was tempered by concerns about potential near-term risks to both patients and clinicians, including misguided clinical judgment, increased clinical burden, patient harms, and legal liability. Conclusions: Our results highlight the importance of considering specialist perspectives when deploying AI tools in psychiatric medicine.
  2. Pharmacogenomic (PGx) biomarkers integrated using machine learning can be embedded within the electronic health record (EHR) to provide clinicians with individualized predictions of drug treatment outcomes. Currently, however, drug alerts in the EHR are largely generic (not patient-specific) and contribute to increased clinician stress and burnout. Improving the usability of PGx alerts is an urgent need. This work therefore aimed to identify principles for optimal PGx alert design through a health-system-wide, mixed-methods study. Clinicians representing multiple practices and care settings (N = 1062) in urban, rural, and underserved regions were invited to complete an electronic survey comparing the usability of three drug alerts for citalopram as a case study. Alert 1 contained a generic warning of pharmacogenomic effects on citalopram metabolism. Alerts 2 and 3 provided patient-specific predictions of citalopram efficacy with varying depth of information. Primary outcomes included the System Usability Scale (SUS) score (0–100 points) of each alert, the perceived impact of each alert on stress and decision-making, and clinicians' suggestions for alert improvement. Secondary outcomes included the assessment of alert preference by clinician age, practice type, and geographic setting. Qualitative information was captured to provide context for the quantitative results. The final cohort comprised 305 geographically and clinically diverse clinicians. A simplified, individualized alert (Alert 2) was perceived as beneficial for decision-making and stress compared with a more detailed version (Alert 3) and the generic alert (Alert 1), regardless of age, practice type, or geographic setting. Findings emphasize the need for clinician-guided design of PGx alerts in the era of digital medicine.
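The second abstract reports usability as System Usability Scale (SUS) scores on a 0–100 scale. For context only (this sketch is not taken from the study, which does not describe its scoring pipeline), the standard SUS scoring procedure for a ten-item, five-point questionnaire can be written as:

```python
def sus_score(responses):
    """Standard System Usability Scale (SUS) scoring.

    Takes ten Likert responses (each 1-5). Odd-numbered items are
    positively worded and contribute (response - 1); even-numbered
    items are negatively worded and contribute (5 - response). The
    summed contributions (0-40) are scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    odd_items = sum(r - 1 for r in responses[0::2])   # items 1, 3, 5, 7, 9
    even_items = sum(5 - r for r in responses[1::2])  # items 2, 4, 6, 8, 10
    return 2.5 * (odd_items + even_items)
```

For example, a respondent who strongly agrees with every positive item and strongly disagrees with every negative one (`[5, 1] * 5`) scores 100, while neutral answers on all items (`[3] * 10`) score 50.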